
is called a Markov chain (also Markov process, after Andrei Andreyevich Markov; other spellings: Markoff chain, Markof chain). Knowing only a limited part of the previous history (e.g. only the last throw), we can make prognoses about the future development that are just as good as those based on the entire previous history. But which prognoses? Here we see amazing things: while we cannot predict the next throw at all (it is random, after all), we can predict the outcome space for the entire future: it can only be a one through six.
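This can be illustrated with a small simulation (a Python sketch, not part of the original text): for independent throws of a die, the distribution of the next throw looks the same no matter which value came last, so knowing more of the history would not improve the prediction.

```python
# Illustrative sketch: tabulate which value follows which "last throw".
import random
from collections import Counter

random.seed(1)
throws = [random.randint(1, 6) for _ in range(600_000)]

# For every possible last throw, count the values that follow it.
following = {last: Counter() for last in range(1, 7)}
for prev, nxt in zip(throws, throws[1:]):
    following[prev][nxt] += 1

for last in range(1, 7):
    total = sum(following[last].values())
    probs = [following[last][v] / total for v in range(1, 7)]
    print(last, [f"{p:.3f}" for p in probs])  # every row is close to 1/6 throughout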

So we see: in a random system, nothing at all is predictable in the short term. The long-term prediction, however, is surprisingly simple. The entire range of values is covered according to a probability distribution (in our case: with probability one sixth the die shows a one, a two, a three ... a six; a more complex case for a different system could be a Gaussian distribution); this is the description of the entire future. In the end, however, the individual result can never be controlled: a random system is and remains random.
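The same point can be made concrete with a short simulation (again a Python sketch, not taken from the text): no single throw can be predicted, but the long-run frequencies settle at one sixth per face, which is exactly the "description of the entire future" mentioned above.

```python
# Illustrative sketch: long-run frequencies of a fair die converge to 1/6 per face.
import random
from collections import Counter

random.seed(42)
n = 600_000
counts = Counter(random.randint(1, 6) for _ in range(n))

for face in range(1, 7):
    print(face, round(counts[face] / n, 3))  # each relative frequency is close to 0.167
```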

And what about biological systems? The fascinating thing is that biological systems are a mixture of both extremes, of randomness and total order. Such systems are called chaotic systems. We have already seen an example from everyday life: the weather. Here there is only a more or less reliable forecast for the next day or perhaps the next two weeks, but no certainty for longer periods. On the other hand, the outcome space is quite fixed, namely the climate of the place, which is e.g. temperate and sets the frame for the possible weather.
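A classic toy model of such behaviour (purely illustrative, not taken from the text) is the logistic map: two trajectories that start almost identically stay close for a few steps, so a short-term forecast works, then diverge completely, so no long-term forecast is possible, yet both remain inside the fixed interval [0, 1], the analogue of the climate that frames the possible weather.

```python
# Illustrative sketch: the logistic map x -> r*x*(1-x) with r = 4 behaves chaotically.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-9          # two almost identical starting values
for step in range(1, 51):
    x, y = logistic(x), logistic(y)
    if step % 10 == 0:
        # both values stay within [0, 1], but their difference grows until
        # the trajectories are completely decoupled
        print(step, round(x, 6), round(y, 6), round(abs(x - y), 6))
```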


9.3  Typical Behaviour of Systems